Web Survey Bibliography
Title Analyzing Cognitive Burden of Survey Questions with Paradata: A Web Survey Experiment
Author Hoehne, J. K.; Schlosser, S.; Krebs, D.
Year 2016
Access date 03.08.2016
Full text PDF (1.01 MB)
Abstract
Relevance and Research Question: The measurement of attitudes, opinions, and behaviors with agree/disagree questions is a common and popular method in empirical social research; this question type is used frequently in the German General Social Survey, the Eurobarometer, and the ISSP, for instance. Fowler (1995), however, suggests that agree/disagree questions require effortful and intricate cognitive information processing, and he argues for item-specific questions because they appear to be less burdensome. So far, this assumption has lacked empirical evidence.
Methods and Data: In the current study, we examine the cognitive burden of agree/disagree and item-specific questions in web surveys using paradata. Measuring response times makes it possible to examine the cognitive burden of different question types and provides insight into cognitive response processes. We used an innovative double-stage outlier correction: a first stage based on respondents' activity in the web survey while answering, followed by an outlier definition based on the distribution of the response times. Additionally, we captured computer mouse clicks to evaluate the response times. We conducted a two-group experiment based on an onomastic sampling approach. The first experimental group (n = 533) received eight agree/disagree questions on achievement motivation; the second group (n = 472) received eight comparable item-specific questions on achievement motivation.
Results: Our findings suggest that the item-specific questions show, on average, significantly higher response times than their agree/disagree counterparts. The computer mouse clicks, however, show no significant differences between the two experimental groups. Hence, the question types do not seem to affect clicking behavior systematically.
Added Value: Altogether, it appears that item-specific questions, contrary to the current state of research, require deeper cognitive information processing than agree/disagree questions.
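The double-stage outlier correction described in the Methods can be sketched roughly as follows. This is a minimal, hypothetical illustration, not the authors' implementation: the activity criterion in stage 1 is simulated with a boolean flag (the paper's actual paradata rule is not specified here), and stage 2 uses a mean ± 2 SD cutoff on log response times, a common convention whose exact form in the study is unknown.

```python
import math

def double_stage_correction(times, active_flags, sd_cutoff=2.0):
    """Two-stage response-time outlier correction (illustrative sketch).

    Stage 1: keep only response times recorded while the respondent was
    active on the survey page (activity flags stand in for paradata).
    Stage 2: log-transform the remaining times (response times are
    right-skewed) and drop values outside mean +/- sd_cutoff * SD.
    """
    # Stage 1: activity-based filtering.
    stage1 = [t for t, active in zip(times, active_flags) if active]
    # Stage 2: distribution-based trimming on the log scale.
    logs = [math.log(t) for t in stage1]
    mean = sum(logs) / len(logs)
    sd = math.sqrt(sum((x - mean) ** 2 for x in logs) / (len(logs) - 1))
    lo, hi = mean - sd_cutoff * sd, mean + sd_cutoff * sd
    return [t for t, lg in zip(stage1, logs) if lo <= lg <= hi]

# Example: seconds per item; one case flagged as inactive (2.5),
# one implausibly slow case (45.0) removed by the distributional stage.
times = [2.1, 2.8, 3.0, 2.4, 2.9, 3.1, 2.6, 2.7, 45.0, 2.5, 3.2, 2.2]
active = [True, True, True, True, True, True, True, True, True, False, True, True]
cleaned = double_stage_correction(times, active)
```

Trimming on the log scale is a design choice: response-time distributions are typically right-skewed, so symmetric cutoffs on raw times would flag far more slow responses than fast ones.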
Access/Direct link Conference Homepage (abstract)
Year of publication 2016
Bibliographic type Conferences, workshops, tutorials, presentations
Web survey bibliography (272)
- Usability Testing for Survey Research; 2017; Geisen, E.; Romano Bergstrom, J. C.
- A Case Study on Evaluating the Relevance of Some Rules for Writing Requirements through an Online Survey...; 2017; Warnier, M.; Condamines, A.
- Web Health Monitoring Survey: A New Approach to Enhance the Effectiveness of Telemedicine Systems ; 2016; Romano, M. F.; Sardella, M. V.; Alboni, F.
- Design and test of a web-survey for collecting observer’s ratings on dairy goats’ behavioural...; 2016; Vieira, A.; Oliveira, M. D.; Nunes, T.; Stilwell, G.
- A look at the unique data-gathering process behind the Harvard Impact Study; 2016; Vitale, J.
- Analyzing Cognitive Burden of Survey Questions with Paradata: A Web Survey Experiment; 2016; Hoehne, J. K.; Schlosser, S.; Krebs, D.
- Gamification of Online Surveys: Design Process, Case Study, and Evaluation; 2015; Harms, J.; Biegler, S.; Wimmer, C.; Kappel, K.; Grechenig, T.
- Finding Item Nonresponse Patterns: Three Internet Survey Experiments Into the Effects of Nonresponse...; 2015; Van De Maat, J.
- The Effects of Adding a Mobile-Compatible Design to the American Life Panel; 2015; Toepoel, V.; Lugtig, P. J.; Amin, A.
- Technology and Reporting of Daily Activities – Considerations for Analysis of Behaviours in Mixed...; 2015; Fisher, K.; Gershuny, J.
- Cheating in web surveys. Evidence from a split-ballot repeated experiment on knowledge questions on...; 2015; Ladini, R.; Vezzoni, C.
- Unplanned use of mobile devices in a probabilistic online panel survey: Patterns of use and implications...; 2015; Poggio, T.; Bosnjak, M.; Bandilla, W.; Weyandt, K.
- The importance of scale direction between different modes; 2015; Agalioti-sgompou, V.
- Examining the Impact of Mobile First and Responsive Web Design on Desktop and Mobile Respondents; 2015; Tharp, D.
- Boosting Probability-Based Web Survey Response Rates via Nonresponse Follow-Up; 2015; Chew, K.; Fontes, A.; Lavrakas, P. J.
- Cognitive Testing of Survey Translations: Does Respondent Language Proficiency Matter?; 2015; Schoua-Glusberg, A.; Park, H.; Meyer, M.; Goerman, P. L.; Sha, M.
- Questionnaire length and breakoffs in web surveys: a meta study; 2014; Vehovar, V., Cehovin, G.
- Inside the Turk Understanding Mechanical Turk as a Participant Pool; 2014; Paolacci, G., Chandler, J.
- Social Media and Online Survey: Tools for Knowledge Management in Health Research ; 2014; Merolli, M., Sanchez, F. J. M., Gray, K.
- Development and validation of a single- item scale for the relative assessment of physical attractiveness...; 2013; Lutz, J.; Kemper, C. J.; Beierlein, C.; etc.
- A standard with quality indicators for web panel surveys: a Swedish example; 2013; Nyfjaell, M.
- Developing a New Mixed-Mode Methodology For a Provincial Park Camper Survey in British Columbia; 2013; Dyck, B. W.
- Scientific impact of the MESS Project: A brief overview; 2013; Das, M.
- Using the iPad as a Prize-Based Incentive to Boost Response Rates: A Case Study at Brigham Young University...; 2013; McClendon, R., Olsen, D.
- Using Qualitative and Quantitative Testing to Improve Hispanic Response Rates for Online Surveys; 2013; Pens, Y., Gentry, R. J.
- The ONS Beyond 2011 Programme & possible implications for social surveys; 2013; Morris, L.
- Survey Research; 2013; Abbott, M. L., McKinney, J.
- The effect of short formative diagnostic web quizzes with minimal feedback; 2013; Baelter, O., Enstroem, E., Klingenberg, B.
- Web CATI (Part of NatCen’s Multi-Mode Approach) ; 2012; Damestani, P., Agur, M.
- What is Online Research?: Using the Internet for Social Science Research; 2012; Hooley, H., Wellens, J., Marriott, J.
- WebSM Study: Survey software features overview ; 2012; Vehovar, V., Cehovin, G., Kavcic, L., Lenar, J.
- What Survey Modes are Most Effective in Eliciting Self-Reports of Criminal or Delinquent Behavior?; 2012; Kleck, G., Roberts, K.
- Assessing Cross-National Equivalence of Measures of Xenophobia: Evidence from Probing in Web Surveys; 2012; Behr, D., Braun, M., Kaczmirek, L.
- Adaptive web sampling in ecology; 2012; Thompson, S. K.
- Online Data Collection in the Agro-Food Sector; 2012; Biffignandi, S., Artaz, R.
- Psychometric properties of an internet administered version of the Marlowe-Crowne Social Desirability...; 2012; Vesteinsdottir, V., Reips, U.-D., Joinson, A. N., Porsdottir, F.
- Research design for studying online communities with web surveys; 2012; Petrovcic, A., Petric, G., Lozar Manfreda, K.
- Case study: Respondent perspective on survey response; 2012; Jarrett, C.
- Presidential Elections in Iceland 2012 – Did online panel surveys give false hope to new candidates...; 2012; Jonsdottir, G. A., Dofradottir, A. G., Bjornsdottir, A. E.
- Internet Mobility Survey Sampling Biases in Measuring Frequency of Use of Transport Modes ; 2012; Diana, M.
- Qualitatively Speaking: Mobile qualitative finally hits its stride; 2012; Bryson, J.
- Using Collaborative Web Technology to Construct the Health Information National Trends Survey; 2012; Moser, R. P., Beckjord, E. B., Finney Rutten, L. J., Blake, K., Hesse, B. W.
- A Shot in the Dark: Measurement Influence on Likelihood to Vaccination; 2012; Higgins, W. B., Thomas, R. K.
- Using Online Panels for National Surveys of Low Incidence Populations: Findings from the CDC Influenza...; 2012; Boyle, J., Ball, S., Ding, H., Srinath, K. P., Euler, G.
- Drop-Off Point for Undergraduate Students on a Web-based Alcohol and Tobacco Use Questionnaire; 2012; Mitra, A.
- An Examination of the 2010 Census Be Counted Program and Its Effects on Census Coverage and Duplication...; 2012; Jackson, G. I., Wechter, K. M.
- The Detection and Effects of Data From Potentially Ineligible Participants in Online Survey Research...; 2012; Grey, J.
- Internet Mobility Survey Sampling Biases in Measuring Frequency of Use of Different Transport Modes; 2012; Diana, M.
- Continuous large-scale volunteer web-surveys: The experience of Lohnspiegel and WageIndicator; 2012; Oez, F.
- FamilyVote – Conducting online surveys with children and families; 2012; Geissler, H., Peeters, H.